Similar Resources
On the Estimation of Mutual Information
Mutual information is a useful measure of dependence between the components of a random vector and is important in many technical applications. Estimation methods are often based on the well-known relation between mutual information and the corresponding entropies. In 1999, Darbellay and Vajda [3] proposed a direct estimation method. In this paper we compare some available estimation methods using diff...
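For orientation, the entropy relation this abstract refers to is I(X;Y) = H(X) + H(Y) - H(X,Y). The sketch below is a minimal plug-in histogram estimator built on that identity; the bin count, toy data, and natural-log units are illustrative assumptions, not the methods compared in the cited paper.

```python
# Minimal plug-in sketch of I(X;Y) = H(X) + H(Y) - H(X,Y) using a fixed
# histogram; bin count and toy data are illustrative assumptions.
import numpy as np

def entropy(counts):
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))           # entropy in nats

def mutual_information(x, y, bins=16):
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    h_x = entropy(joint.sum(axis=1))         # marginal entropy H(X)
    h_y = entropy(joint.sum(axis=0))         # marginal entropy H(Y)
    h_xy = entropy(joint.ravel())            # joint entropy H(X,Y)
    return h_x + h_y - h_xy

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.8 * x + 0.6 * rng.normal(size=5000)    # linearly dependent toy data
print(mutual_information(x, y))              # clearly above 0 for dependent data
```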
On Classification of Bivariate Distributions Based on Mutual Information
Among all measures of independence between random variables, mutual information is the only one that is based on information theory. Mutual information takes into account all kinds of dependencies between variables, i.e., both linear and non-linear dependencies. In this paper we have classified some well-known bivariate distributions into two classes of distributions based on their mutua...
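A minimal illustration of the point made here, assuming scikit-learn's k-nearest-neighbour MI estimator and a synthetic quadratic relationship (not an example from the cited paper): mutual information detects a purely non-linear dependence that Pearson correlation misses.

```python
# Illustrative only: a symmetric, non-linear dependence gives near-zero
# Pearson correlation but clearly positive mutual information.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=5000)
y = x ** 2 + 0.05 * rng.normal(size=5000)    # Y depends on X only non-linearly

print("Pearson r :", np.corrcoef(x, y)[0, 1])                         # ~ 0
print("MI (nats) :", mutual_info_regression(x.reshape(-1, 1), y)[0])  # > 0
```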
MINE: Mutual Information Neural Estimation
We argue that the estimation of mutual information between high-dimensional continuous random variables is achievable by gradient descent over neural networks. This paper presents a Mutual Information Neural Estimator (MINE) that is linearly scalable in dimensionality as well as in sample size. MINE is trainable through back-propagation, and we prove that it is strongly consistent. We illustrate a handful of appl...
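The core idea is to maximise the Donsker-Varadhan lower bound I(X;Z) >= E_P[T(x,z)] - log E_{P x P}[exp T(x,z)] over a "statistics network" T by gradient descent. The sketch below is a bare-bones version of that idea, assuming PyTorch; the network size, optimiser settings, and toy data are assumptions, and it omits the bias-corrected gradient discussed in the paper.

```python
# Bare-bones MINE-style sketch: maximise the Donsker-Varadhan bound with a
# small statistics network.  Sizes, optimiser, and data are assumptions.
import torch
import torch.nn as nn

class StatisticsNetwork(nn.Module):
    def __init__(self, dim_x, dim_z, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_x + dim_z, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, z):
        return self.net(torch.cat([x, z], dim=1)).squeeze(-1)

def dv_bound(T, x, z):
    t_joint = T(x, z)                               # samples from the joint
    t_marg = T(x, z[torch.randperm(z.size(0))])     # shuffled z ~ product of marginals
    n = torch.tensor(float(t_marg.size(0)))
    return t_joint.mean() - (torch.logsumexp(t_marg, dim=0) - torch.log(n))

x = torch.randn(512, 1)
z = x + 0.5 * torch.randn(512, 1)                   # toy correlated data
T = StatisticsNetwork(1, 1)
opt = torch.optim.Adam(T.parameters(), lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    (-dv_bound(T, x, z)).backward()                 # gradient ascent on the bound
    opt.step()
print("estimated MI lower bound (nats):", dv_bound(T, x, z).item())
```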
Pitch estimation using mutual information
A spectro-temporal method based on Mutual Information (MI) is proposed for pitch estimation of voiced speech signals. We use MI as the similarity measure between voiced speech segments and their delayed versions. Instead of measuring only linear dependencies, MI measures general statistical dependency, which suits the dynamic characteristics of speech signals. Moreover, higher-order statistics are directly enco...
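As a rough sketch of the delay-based idea (not the spectro-temporal method of the cited paper), one can scan candidate delays and take the delay that maximises MI between a voiced frame and its delayed copy as the pitch period. The synthetic two-harmonic signal, histogram MI, bin count, and lag range below are all assumptions.

```python
# Illustrative delay scan: MI between a frame and its delayed copy is expected
# to peak near the pitch period.  Signal, bins, and lag range are assumptions.
import numpy as np

def hist_mi(a, b, bins=16):
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p = joint / joint.sum()
    px, py = p.sum(axis=1), p.sum(axis=0)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / (px[:, None] * py[None, :])[nz]))

fs = 8000                                       # sampling rate (Hz)
f0 = 125.0                                      # true pitch (Hz); period = 64 samples
t = np.arange(0, 0.2, 1 / fs)                   # 200 ms synthetic "voiced" frame
rng = np.random.default_rng(0)
x = (np.sin(2 * np.pi * f0 * t)
     + 0.5 * np.sin(4 * np.pi * f0 * t)
     + 0.05 * rng.normal(size=t.size))

lags = np.arange(40, 100)                       # candidate periods (80-200 Hz)
mi = [hist_mi(x[:-k], x[k:]) for k in lags]
best_lag = int(lags[np.argmax(mi)])             # expected near 64 samples (~125 Hz)
print("MI-based pitch estimate:", fs / best_lag, "Hz")
```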
Journal
Journal title: Proceedings
Year: 2020
ISSN: 2504-3900
DOI: 10.3390/proceedings2019033031